A Universal Approximation Theorem for Mixture-of-Experts Models
Abstract
The mixture-of-experts (MoE) model is a popular neural network architecture for nonlinear regression and classification. The class of MoE mean functions is known to approximate any unknown target function uniformly, provided that the target function lies in a sufficiently differentiable Sobolev space and that the domain of estimation is the compact unit hypercube. We provide an alter...
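As a rough, illustrative sketch of the kind of mean function this result concerns (not code from the paper; the softmax gate, the choice of linear experts, and all parameter names are assumptions), a K-expert MoE mean function can be evaluated as a gate-weighted combination of the experts' predictions:

import numpy as np

rng = np.random.default_rng(0)

def moe_mean(x, gate_w, gate_b, expert_w, expert_b):
    """Mean function of a K-expert MoE: softmax gate over linear expert means.

    x        : (d,) input in the unit hypercube [0, 1]^d
    gate_w   : (K, d) gating weights,  gate_b   : (K,) gating biases
    expert_w : (K, d) expert weights,  expert_b : (K,) expert biases
    """
    logits = gate_w @ x + gate_b            # gating-network scores
    gates = np.exp(logits - logits.max())
    gates /= gates.sum()                    # softmax gating probabilities
    expert_means = expert_w @ x + expert_b  # each expert's local prediction
    return gates @ expert_means             # gate-weighted mixture mean

# Example with K = 3 experts on a 2-dimensional input
K, d = 3, 2
x = rng.uniform(size=d)
print(moe_mean(x,
               gate_w=rng.normal(size=(K, d)), gate_b=rng.normal(size=K),
               expert_w=rng.normal(size=(K, d)), expert_b=rng.normal(size=K)))

The approximation question the paper addresses is whether functions of this form, with enough experts, can come uniformly close to any sufficiently smooth target on the unit hypercube.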
Similar Resources

Universal approximation theorem for Dirichlet series
The paper extends a theorem of Costakis and Vlachou on simultaneous approximation of holomorphic functions to the setting of Dirichlet series that are absolutely convergent in the right half of the complex plane. The derivation operator used in the analytic case is replaced by a weighted backward shift operator in the Dirichlet case. We show the similarities and extensions in...
Asymptotic properties of mixture-of-experts models
The statistical properties of the likelihood ratio test statistic (LRTS) for mixture-of-experts models are addressed in this paper. This question is essential when estimating the number of experts in the model. Our purpose is to extend the existing results for mixtures (Liu and Shao, 2003) and mixtures of multilayer perceptrons (Olteanu and Rynkiewicz, 2008). In this paper we study a s...
A study on thermodynamic models for simulation of 1,3-butadiene purification columns
Attempts have been made to study the thermodynamic behavior of 1,3-butadiene purification columns with the aim of retrofitting those columns to more energy-efficient separation schemes. 1,3-Butadiene is purified in two columns in series by being separated from methyl acetylene and 1,2-butadiene in the first and second columns, respectively. Comparisons have been made among different therm...
Mixture of Experts for Persian handwritten word recognition
This paper presents the results of Persian handwritten word recognition based on the Mixture of Experts (ME) technique. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our proposed model, we used a Mixture of Experts of multilayer perceptrons with a momentum term in the classification ...
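For illustration only (this is not the paper's implementation; single linear layers stand in for the MLP experts, and all names and shapes are assumptions), the gated combination of expert class posteriors in such a classifier can be sketched as:

import numpy as np

rng = np.random.default_rng(1)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def me_classify(x, gate_w, expert_w):
    """Combine expert class-probability outputs with a gating network.

    x        : (d,) feature vector of a word image (hypothetical features)
    gate_w   : (K, d) gating-network weights, one score per expert
    expert_w : (K, C, d) per-expert classifier weights over C classes
    """
    gates = softmax(gate_w @ x)                    # (K,) expert responsibilities
    expert_probs = softmax(expert_w @ x, axis=-1)  # (K, C) per-expert class posteriors
    combined = gates @ expert_probs                # (C,) gate-weighted class posterior
    return combined.argmax(), combined

K, C, d = 4, 10, 16                                # experts, classes, feature size
x = rng.normal(size=d)
label, probs = me_classify(x, rng.normal(size=(K, d)), rng.normal(size=(K, C, d)))
print(label, probs.round(3))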
Journal

Journal title: Neural Computation
Year: 2016
ISSN: 0899-7667, 1530-888X
DOI: 10.1162/neco_a_00892